Graphical Model

A graphical model, probabilistic graphical model (PGM), or structured probabilistic model is a probabilistic model for which a graph expresses the conditional dependence structure between random variables. Graphical models are commonly used in probability theory, statistics (particularly Bayesian statistics), and machine learning.


Types
Generally, probabilistic graphical models use a graph-based representation as the foundation for encoding a distribution over a multi-dimensional space, together with a graph that is a compact or factorized representation of a set of independences that hold in the specific distribution. Two branches of graphical representations of distributions are commonly used, namely Bayesian networks and Markov random fields. Both families encompass the properties of factorization and independences, but they differ in the set of independences they can encode and the factorization of the distribution that they induce.
Koller, Daphne; Friedman, Nir (2009). Probabilistic Graphical Models: Principles and Techniques. MIT Press. ISBN 9780262013192.


Undirected Graphical Model
The undirected graph shown (here, a star in which A is joined by an edge to each of B, C, and D) may have one of several interpretations; the common feature is that the presence of an edge implies some sort of dependence between the corresponding random variables. From this graph, we might deduce that B, C, and D are all conditionally independent given A. This means that if the value of A is known, then the values of B, C, and D provide no further information about each other. Equivalently (in this case), the joint probability distribution can be factorized as:

P[A,B,C,D] = f_{AB}[A,B] \cdot f_{AC}[A,C] \cdot f_{AD}[A,D]

for some non-negative functions f_{AB}, f_{AC}, f_{AD}.
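
To make the factorization concrete, here is a minimal sketch in Python, with hypothetical factor tables chosen purely for illustration, that builds the joint distribution over binary variables A, B, C, D from three non-negative pairwise factors and checks numerically that B and C are conditionally independent given A.

    import itertools

    # Hypothetical non-negative pairwise factors over binary variables (values are illustrative only).
    f_AB = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 3.0, (1, 1): 1.0}
    f_AC = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 4.0}
    f_AD = {(0, 0): 1.0, (0, 1): 1.0, (1, 0): 2.0, (1, 1): 3.0}

    # Unnormalized joint: P[A,B,C,D] proportional to f_AB[A,B] * f_AC[A,C] * f_AD[A,D].
    joint = {}
    for a, b, c, d in itertools.product([0, 1], repeat=4):
        joint[(a, b, c, d)] = f_AB[(a, b)] * f_AC[(a, c)] * f_AD[(a, d)]
    Z = sum(joint.values())                       # normalization constant
    joint = {k: v / Z for k, v in joint.items()}  # proper probability distribution

    def marginal(dist, keep):
        """Sum out every variable except those at the index positions listed in `keep`."""
        out = {}
        for assignment, p in dist.items():
            key = tuple(assignment[i] for i in keep)
            out[key] = out.get(key, 0.0) + p
        return out

    # Check B independent of C given A: P(B,C|A) must equal P(B|A) * P(C|A) for every value of A.
    p_abc = marginal(joint, (0, 1, 2))
    p_ab = marginal(joint, (0, 1))
    p_ac = marginal(joint, (0, 2))
    p_a = marginal(joint, (0,))
    for a, b, c in itertools.product([0, 1], repeat=3):
        lhs = p_abc[(a, b, c)] / p_a[(a,)]
        rhs = (p_ab[(a, b)] / p_a[(a,)]) * (p_ac[(a, c)] / p_a[(a,)])
        assert abs(lhs - rhs) < 1e-9
    print("B and C are conditionally independent given A")

The same check succeeds for any non-negative choice of f_AB, f_AC, f_AD, because conditioning on A decouples the three factors.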


Bayesian network
If the network structure of the model is a directed acyclic graph, the model represents a factorization of the joint probability of all random variables. More precisely, if the events are X_1,\ldots,X_n then the joint probability satisfies

P[X_1,\ldots,X_n] = \prod_{i=1}^n P[X_i \mid \text{pa}(X_i)]

where \text{pa}(X_i) is the set of parents of node X_i (i.e., nodes with edges directed towards X_i). In other words, the joint distribution factors into a product of conditional distributions. For example, in the directed acyclic graph shown in the figure, this factorization would be

P[A,B,C,D] = P[A]\cdot P[B]\cdot P[C] \cdot P[D\mid A,C].

Each node is conditionally independent of its non-descendants given the values of its parents. More generally, any two sets of nodes are conditionally independent given a third set if a criterion called d-separation holds in the graph. Local independences and global independences are equivalent in Bayesian networks.
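
As a concrete illustration of this factorization, the following is a minimal sketch in Python, using hypothetical conditional probability tables invented for this example, that assembles the joint P[A,B,C,D] = P[A]\cdot P[B]\cdot P[C]\cdot P[D\mid A,C] for binary variables, verifies that it sums to one, and computes the marginal P[D] by summing out the remaining variables.

    import itertools

    # Hypothetical (conditional) probability tables for binary variables; the numbers are illustrative only.
    P_A = {0: 0.6, 1: 0.4}
    P_B = {0: 0.7, 1: 0.3}
    P_C = {0: 0.5, 1: 0.5}
    # P(D | A, C), indexed as P_D_given_AC[(a, c)][d]
    P_D_given_AC = {
        (0, 0): {0: 0.9, 1: 0.1},
        (0, 1): {0: 0.6, 1: 0.4},
        (1, 0): {0: 0.3, 1: 0.7},
        (1, 1): {0: 0.2, 1: 0.8},
    }

    # Joint distribution from the Bayesian-network factorization
    # P[A,B,C,D] = P[A] * P[B] * P[C] * P[D | A, C].
    joint = {
        (a, b, c, d): P_A[a] * P_B[b] * P_C[c] * P_D_given_AC[(a, c)][d]
        for a, b, c, d in itertools.product([0, 1], repeat=4)
    }

    assert abs(sum(joint.values()) - 1.0) < 1e-9  # a valid joint distribution sums to 1

    # Marginal P(D): sum out A, B and C.
    P_D = {d: sum(p for (a, b, c, dd), p in joint.items() if dd == d) for d in (0, 1)}
    print(P_D)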

This type of graphical model is known as a directed graphical model, Bayesian network, or belief network. Classic machine learning models such as hidden Markov models and neural networks, as well as newer models such as variable-order Markov models, can be considered special cases of Bayesian networks.

One of the simplest Bayesian networks is the naive Bayes classifier.
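
For instance, a naive Bayes classifier is a Bayesian network in which a single class variable is the sole parent of every feature, so the joint factorizes as P[C, X_1, \ldots, X_n] = P[C]\prod_{i=1}^n P[X_i\mid C]. Below is a minimal sketch in Python, with made-up tables for a toy two-feature spam example, that classifies by comparing the unnormalized posteriors P[C]\prod_i P[x_i\mid C].

    # Hypothetical toy naive Bayes model: classes "spam"/"ham" and two binary features
    # (all probabilities invented for this example).
    P_class = {"spam": 0.4, "ham": 0.6}
    # P(feature value | class) for each feature.
    P_feat = {
        "contains_link":    {"spam": {True: 0.8, False: 0.2}, "ham": {True: 0.3, False: 0.7}},
        "all_caps_subject": {"spam": {True: 0.6, False: 0.4}, "ham": {True: 0.1, False: 0.9}},
    }

    def classify(features):
        """Return the class maximizing P[C] * prod_i P[x_i | C], plus the unnormalized scores."""
        scores = {}
        for c, prior in P_class.items():
            score = prior
            for name, value in features.items():
                score *= P_feat[name][c][value]
            scores[c] = score
        return max(scores, key=scores.get), scores

    label, scores = classify({"contains_link": True, "all_caps_subject": False})
    print(label, scores)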


Cyclic Directed Graphical Models
The next figure depicts a graphical model with a cycle. This may be interpreted in terms of each variable 'depending' on the values of its parents in some manner. The particular graph shown suggests a joint probability density that factors as
P[A,B,C,D] = P[A]\cdot P[B]\cdot P[C,D\mid A,B],
but other interpretations are possible.
(1996). Proceedings of the Twelfth Conference on Uncertainty in Artificial Intelligence. Morgan Kaufmann. ISBN 9781558604124.


Other types
  • Dependency network where cycles are allowed
  • Tree-augmented classifier or TAN model
  • Targeted Bayesian network learning (TBNL)
  • A factor graph is an undirected bipartite graph connecting variables and factors. Each factor represents a function over the variables it is connected to. This is a helpful representation for understanding and implementing belief propagation (a minimal sum-product sketch appears after this list).
  • A clique tree or junction tree is a tree of cliques, used in the junction tree algorithm.
  • A chain graph is a graph which may have both directed and undirected edges, but without any directed cycles (i.e. if we start at any vertex and move along the graph respecting the directions of any arrows, we cannot return to the vertex we started from if we have passed an arrow). Both directed acyclic graphs and undirected graphs are special cases of chain graphs, which can therefore provide a way of unifying and generalizing Bayesian and Markov networks.
  • An ancestral graph is a further extension, having directed, bidirected and undirected edges.
  • Random field techniques
    • A Markov random field, also known as a Markov network, is a model over an undirected graph. A graphical model with many repeated subunits can be represented with plate notation.
    • A conditional random field is a discriminative model specified over an undirected graph.
  • A restricted Boltzmann machine is a bipartite generative model specified over an undirected graph.
  • A staged tree is an extension of a Bayesian network for sequences of discrete-valued events. Staged trees allow for context-specific independences and non-product sample spaces.
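
The factor graph entry above mentions belief propagation. Here is a minimal sketch in Python of sum-product message passing on a tree-structured factor graph, using an illustrative three-variable chain A -- f1 -- B -- f2 -- C with made-up pairwise factor tables, checked against brute-force marginalization.

    import itertools

    # Illustrative pairwise factors on the chain A -- f1 -- B -- f2 -- C (binary variables).
    f1 = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 3.0}  # factor over (A, B)
    f2 = {(0, 0): 1.0, (0, 1): 4.0, (1, 0): 2.0, (1, 1): 1.0}  # factor over (B, C)
    vals = [0, 1]

    def normalize(m):
        z = sum(m.values())
        return {k: v / z for k, v in m.items()}

    # Sum-product messages into variable B (each factor sums out its other neighbour).
    msg_f1_to_B = {b: sum(f1[(a, b)] for a in vals) for b in vals}  # sum out A
    msg_f2_to_B = {b: sum(f2[(b, c)] for c in vals) for b in vals}  # sum out C
    belief_B = normalize({b: msg_f1_to_B[b] * msg_f2_to_B[b] for b in vals})

    # Brute-force marginal of B for comparison.
    joint = {(a, b, c): f1[(a, b)] * f2[(b, c)] for a, b, c in itertools.product(vals, repeat=3)}
    brute_B = normalize({b: sum(p for (a, bb, c), p in joint.items() if bb == b) for b in vals})

    for b in vals:
        assert abs(belief_B[b] - brute_B[b]) < 1e-9
    print("sum-product marginal of B matches brute force:", belief_B)

On a tree (here a chain), these messages give exact marginals; on graphs with cycles the same updates yield the approximate loopy belief propagation algorithm.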


Applications
Graphical models provide algorithms for discovering and analyzing structure in complex distributions, describing them succinctly, and extracting unstructured information, which allows such models to be constructed and utilized effectively. Applications of graphical models include causal inference, information extraction, speech recognition, computer vision, decoding of low-density parity-check codes, modeling of gene regulatory networks, gene finding and diagnosis of diseases, and graphical models for protein structure.


See also
  • Belief propagation
  • Structural equation model


Further reading

Books and book chapters

  • Bishop, Christopher M. (2006). Pattern Recognition and Machine Learning. Springer. ISBN 9780387310732.
  • Cowell, Robert G.; Dawid, A. Philip; Lauritzen, Steffen L.; Spiegelhalter, David J. (1999). Probabilistic Networks and Expert Systems. Springer. ISBN 9780387987675.
    A more advanced and statistically oriented book
  • Jensen, Finn V. (1996). An Introduction to Bayesian Networks. Springer. ISBN 9780387915029.
  • Pearl, Judea (1988). Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann. ISBN 9781558604797.
    A computational reasoning approach, where the relationships between graphs and probabilities were formally introduced.


